150 research outputs found
Traffic flow reconstruction by solving indeterminacy on traffic distribution at junctions
Abstract Knowledge of the real-time traffic flow status in each segment of a whole road network in a city or area is becoming fundamental for a large number of smart services, such as routing, planning, dynamic service tuning, and healthy-walk applications. Rescue teams, police departments, and ambulances need to know the status of the network in real time with high precision. On the other hand, the cost of obtaining this information, either through direct measurement, which means instrumenting the whole network, or by acquiring data from international providers such as Google or TomTom, is very high. Traditional modeling and computing approaches are not satisfactory, since they rely on many assumptions that typically change over time, as happens with traffic distribution at junctions; in short, they cannot cover the whole network with the needed precision. In this paper, the above problem is addressed by providing a solution that guarantees traffic flow reconstruction with high precision and resolves the indeterminacy of traffic distribution at junctions for large networks. The identified solution can be classified as a stochastic relaxation technique and proved affordable on a GPU-based parallel architecture. The result was obtained in the framework of Sii-Mobility, a very large Italian national research project on smart city transport systems, and is currently exploited in a number of cities and regions across Europe and by several European Commission research projects (Snap4City, TRAFAIR).
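The stochastic relaxation idea can be sketched as follows: randomly perturb the unknown turning ratios at a junction and keep only the perturbations that reduce the mismatch with observed flows. Below is a minimal, illustrative Python sketch on invented single-junction data; it is a toy random-search variant, not the paper's GPU implementation.

```python
import random

def reconstruct_turning_ratios(inflows, observed_outflows,
                               iters=20000, seed=0):
    """Toy stochastic relaxation: randomly perturb the turning-ratio
    rows of one junction and keep only changes that reduce the squared
    mismatch between predicted and observed outflows."""
    rng = random.Random(seed)
    n_in, n_out = len(inflows), len(observed_outflows)
    # Start from a uniform split: each inflow spread evenly over exits.
    ratios = [[1.0 / n_out] * n_out for _ in range(n_in)]

    def error(r):
        pred = [sum(inflows[i] * r[i][j] for i in range(n_in))
                for j in range(n_out)]
        return sum((p - o) ** 2
                   for p, o in zip(pred, observed_outflows))

    best = error(ratios)
    for _ in range(iters):
        i = rng.randrange(n_in)
        # Propose a perturbed row, renormalised so it still sums to 1.
        row = [max(1e-9, x + rng.gauss(0.0, 0.1)) for x in ratios[i]]
        total = sum(row)
        row = [x / total for x in row]
        old, ratios[i] = ratios[i], row
        err = error(ratios)
        if err < best:
            best = err          # keep the improving move
        else:
            ratios[i] = old     # revert the worsening move
    return ratios, best

# Two inflows of 600 and 400 vehicles/h; exits observed at 700 and 300.
ratios, residual = reconstruct_turning_ratios([600.0, 400.0],
                                              [700.0, 300.0])
print(ratios, residual)
```

The accept-if-better loop is the relaxation step: each junction's distribution is adjusted independently, which is what makes this class of method amenable to a GPU, where many junctions can be relaxed in parallel.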
Performance assessment of RDF graph databases for smart city services
Abstract Smart cities provide advanced services by aggregating and exploiting data from different sources. Cities collect static data, such as road graphs and service descriptions, as well as dynamic/real-time data, such as weather forecasts, traffic sensors, bus positions, city sensors, events, emergency data, and flows. RDF stores may be used to set up knowledge bases that integrate heterogeneous information for web and mobile applications, so that the data can be used for new advanced services for citizens and city administrators, exploiting inferential capabilities, temporal and spatial reasoning, and text indexing. In this paper, the needs and constraints for RDF stores used in smart city services are evaluated, together with the currently available RDF stores. The assessment model allows a full understanding of whether an RDF store is suitable as a basis for smart city modeling and applications. The RDF assessment model is also supported by a benchmark that extends the state-of-the-art RDF store benchmarks. The comparison has been applied to a number of well-known RDF stores, such as Virtuoso, GraphDB (formerly OWLIM), Oracle, StarDog, and many others. The paper also reports the adoption of the proposed Smart City RDF Benchmark on the basis of the Florence smart city model, data sets, and tools, accessible as Km4City ( Http://www.Km4City.org ), and adopted in the European Commission smart city projects RESOLUTE H2020 and REPLICATE H2020, and in the Sii-Mobility national smart city project in Italy.
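The kind of workload such a benchmark measures can be illustrated with a minimal in-memory triple-pattern match in pure Python; the triples and predicate names below are invented stand-ins for real smart-city RDF data, and an actual RDF store would answer the equivalent SPARQL pattern.

```python
import time

# Invented toy triples standing in for smart-city RDF data.
triples = [
    ("sensor1", "observes", "roadA"),
    ("sensor1", "value", 42),
    ("sensor2", "observes", "roadB"),
    ("sensor2", "value", 17),
]

def match(s=None, p=None, o=None):
    """Return all triples matching a pattern; None acts as a wildcard,
    mimicking a basic SPARQL triple pattern such as `?s :value ?v`."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Benchmarks time exactly this kind of lookup at scale.
start = time.perf_counter()
readings = match(p="value")     # all current sensor readings
elapsed = time.perf_counter() - start
print(readings, f"({elapsed * 1e6:.1f} us)")
```

A real assessment additionally exercises inference, geo-spatial and full-text queries, and bulk loading, which is where the stores under comparison differ most.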
Km4City Ontology Building vs Data Harvesting and Cleaning for Smart-city Services
Presently, a very large number of public and private data sets are available from local governments. In most cases, they are not semantically interoperable, and a huge human effort would be needed to create integrated ontologies and knowledge bases for a smart city. Smart city ontologies are not yet standardized, and much research work is needed to identify models that can easily support data reconciliation and the management of complexity, and that enable reasoning over the data. In this paper, a system for the ingestion and reconciliation of smart city data, such as road graphs, services available along the roads, and traffic sensors, is proposed. The system manages large volumes of data coming from a variety of sources, covering both static and dynamic data. These data are mapped to a smart city ontology, called Km4City (Knowledge Model for City), and stored in an RDF store, where they are available to applications via SPARQL queries, providing new services to users through specific applications for public administrations and enterprises. The paper presents the process adopted to produce the ontology and the big data architecture for feeding the knowledge base from open and private data, together with the mechanisms adopted for data verification, reconciliation, and validation. Some examples of the possible uses of the resulting coherent big data knowledge base are also offered and are accessible from the RDF store and related services. The article also presents the work performed on reconciliation algorithms and their comparative assessment and selection.
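Reconciliation of the kind assessed here can be illustrated with a small fuzzy string-matching sketch: the same city services often appear under slightly different names in different municipal datasets. The records and the 0.6 similarity threshold below are invented for illustration and do not reproduce the paper's actual algorithms.

```python
import difflib

# Invented sample records: the same services named differently
# in two hypothetical municipal datasets.
dataset_a = ["Via Roma Pharmacy", "Ponte Vecchio Parking", "Duomo Bus Stop"]
dataset_b = ["via roma pharmacy ", "Parking Ponte Vecchio", "Bus stop - Duomo"]

def similarity(a, b):
    """Case-insensitive sequence similarity in [0, 1]."""
    return difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()

def reconcile(a_names, b_names, threshold=0.6):
    """Pair each record in A with its best fuzzy match in B,
    keeping only matches above the similarity threshold."""
    pairs = []
    for a in a_names:
        best = max(b_names, key=lambda b: similarity(a, b))
        score = similarity(a, best)
        if score >= threshold:
            pairs.append((a, best, round(score, 2)))
    return pairs

for a, b, s in reconcile(dataset_a, dataset_b):
    print(a, "<->", b, s)
```

With these toy records, the reordered "Parking Ponte Vecchio" is still matched, while "Bus stop - Duomo" falls below the threshold and would be left for a later reconciliation pass or manual validation, which is the sort of trade-off a comparative assessment of algorithms has to quantify.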
- …